Extended Gauss-Newton and ADMM-Gauss-Newton algorithms for low-rank matrix optimization

Authors

Abstract

In this paper, we develop a variant of the well-known Gauss-Newton (GN) method to solve a class of nonconvex optimization problems involving low-rank matrix variables. As opposed to the standard GN method, our algorithm can handle a general smooth convex objective function. We show, under mild conditions, that the proposed algorithm globally and locally converges to a stationary point of the original problem. We also show empirically that it achieves more accurate solutions than the alternating minimization algorithm (AMA). We then specialize our scheme to the symmetric case, where AMA is not applicable, and prove its convergence. Next, we incorporate our scheme into the alternating direction method of multipliers (ADMM) to obtain an ADMM-GN algorithm, and show that, under mild conditions and a proper choice of the penalty parameter, this algorithm converges. Finally, we provide several numerical experiments to illustrate our algorithms. Our results show that the new algorithms have encouraging performance compared with existing methods.
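To make the ADMM splitting concrete, here is a rough sketch of a scheme of the kind described above. It is a hypothetical illustration, not the paper's algorithm: it assumes the simple choice f(W) = ½‖W − B‖²_F (so the W-update has a closed form), and it uses one alternating least-squares pass as a stand-in for the paper's Gauss-Newton step on the (U, V) subproblem. All names and parameter values are assumptions.

```python
import numpy as np

def admm_lowrank(B, r, rho=1.0, iters=500, delta=1e-10, seed=0):
    """Hypothetical sketch of an ADMM splitting for
        min_W f(W)  s.t.  W = U V^T,  U: m x r,  V: n x r,
    with f(W) = 0.5 * ||W - B||_F^2 so the W-update is closed-form.
    The (U, V)-update uses one alternating least-squares pass as a
    stand-in for a Gauss-Newton step on the coupling term.
    """
    m, n = B.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    W = U @ V.T
    Lam = np.zeros((m, n))  # dual variable for the constraint W = U V^T
    I = np.eye(r)
    for _ in range(iters):
        # W-update: minimize f(W) + <Lam, W - UV^T> + (rho/2)||W - UV^T||^2;
        # setting the gradient to zero gives the closed form below.
        W = (B - Lam + rho * (U @ V.T)) / (1.0 + rho)
        # (U, V)-update: fit U V^T to M = W + Lam/rho (one ALS pass,
        # with a tiny ridge delta for numerical safety).
        M = W + Lam / rho
        U = M @ V @ np.linalg.inv(V.T @ V + delta * I)
        V = M.T @ U @ np.linalg.inv(U.T @ U + delta * I)
        # Dual ascent on the consensus constraint W = U V^T.
        Lam = Lam + rho * (W - U @ V.T)
    return U, V
```

When B itself has rank at most r, the constrained minimizer is W = B, so the product U Vᵀ should approach B as the iterations proceed.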


Similar articles

Low Complexity Damped Gauss-Newton Algorithms for CANDECOMP/PARAFAC

The damped Gauss-Newton (dGN) algorithm for CANDECOMP/PARAFAC (CP) decomposition can handle the challenges of collinearity and different magnitudes of the factors; nevertheless, for factorization of an N-D tensor of size I1 × ··· × IN with rank R, the algorithm is computationally demanding due to the construction of a large approximate Hessian of size (RT × RT) and its inversion, where T = n...


An Efficient Gauss-Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations

Abstract. We derive and study a Gauss-Newton method for computing a symmetric low-rank product XX^T, where X ∈ R^{n×k} for k < n, that is closest to a given symmetric matrix A ∈ R^{n×n} in Frobenius norm. When A = B^T B (or BB^T), this problem essentially reduces to finding a truncated singular value decomposition of B. Our Gauss-Newton method, which has a particularly simple form, shares the same or...
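The Gauss-Newton subproblem behind such a method can be sketched as follows. This is a hypothetical, small-scale illustration rather than the authors' specialized update: it forms the full Jacobian of vec(XXᵀ − A) explicitly via Kronecker products and a commutation matrix, which is only viable for small n and k.

```python
import numpy as np

def gn_step_symmetric(A, X):
    """One undamped Gauss-Newton step for min_X ||X X^T - A||_F^2.

    Hypothetical textbook sketch: builds the Jacobian of the residual
    r(X) = vec(X X^T - A) explicitly, so it only scales to small sizes.
    """
    n, k = X.shape
    # Residual in column-major (Fortran) vectorization.
    r = (X @ X.T - A).flatten(order="F")
    # Commutation matrix K with K vec(dX) = vec(dX^T).
    K = np.zeros((n * k, n * k))
    for i in range(n):
        for j in range(k):
            K[j + i * k, i + j * n] = 1.0
    # d vec(X X^T) = (X ⊗ I) vec(dX) + (I ⊗ X) vec(dX^T),
    # using vec(ABC) = (C^T ⊗ A) vec(B).
    J = np.kron(X, np.eye(n)) + np.kron(np.eye(n), X) @ K
    # Minimum-norm least-squares solution of J d = -r (J is rank
    # deficient because X X^T is invariant under X -> X Q, Q orthogonal).
    d, *_ = np.linalg.lstsq(J, -r, rcond=None)
    return X + d.reshape(n, k, order="F")
```

On a zero-residual instance (A exactly of the form YYᵀ), iterating this step from a nearby starting point drives the residual toward zero rapidly, consistent with the local behavior of Gauss-Newton on consistent problems.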


A Gauss-Newton method for convex composite optimization

An extension of the Gauss-Newton method for nonlinear equations to convex composite optimization is described and analyzed. Local quadratic convergence is established for the minimization of h ∘ F under two conditions, namely that h has a set of weak sharp minima, C, and that there is a regular point of the inclusion F(x) ∈ C. This result extends a similar convergence result due to Womersley (this journa...


Practical Gauss-Newton Optimisation for Deep Learning

The curvature matrix depends on the specific optimisation method and will often be only an estimate. For notational simplicity, the dependence of f̂ on θ is omitted. Setting C to the true Hessian matrix of f would make f̂ the exact second-order Taylor expansion of the function around θ. However, when f is a nonlinear function, the Hessian can be indefinite, which leads to an ill-conditioned quadra...
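The indefiniteness issue mentioned above can be seen already in one dimension. The following hypothetical example (not from the paper) compares, for a scalar least-squares objective f(t) = ½ r(t)², the exact second derivative f'' = r'(t)² + r(t)·r''(t), which can be negative when the residual is large, with the Gauss-Newton curvature r'(t)², which is always nonnegative.

```python
import numpy as np

def curvatures(t):
    """Exact vs. Gauss-Newton curvature of f(t) = 0.5 * r(t)^2
    for the (hypothetical) residual r(t) = sin(t) - 2, which is
    bounded away from zero, so the residual term never vanishes."""
    r = np.sin(t) - 2.0
    dr, ddr = np.cos(t), -np.sin(t)
    hessian = dr**2 + r * ddr  # exact curvature; may be negative
    gn = dr**2                 # Gauss-Newton curvature; always >= 0
    return hessian, gn
```

At t = −π/2 the exact curvature is −3 while the Gauss-Newton curvature is (numerically) zero, illustrating why the Gauss-Newton matrix is the preferred positive semidefinite surrogate.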



Journal

Journal title: Journal of applied and numerical optimization

Year: 2021

ISSN: 2562-5527, 2562-5535

DOI: https://doi.org/10.23952/jano.3.2021.1.08